Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- Using unmanned aerial vehicles (UAVs) to track multiple individuals simultaneously in their natural environment is a powerful approach for better understanding the collective behavior of primates. Previous studies have demonstrated the feasibility of automating primate behavior classification from video data, but they were carried out in captivity or with ground-based cameras. To understand group behavior and the self-organization of a collective, however, the whole troop must be observed at a scale where behavior can be related to the natural environment in which ecological decisions are made. To tackle this challenge, this study presents a novel dataset for baboon detection, tracking, and behavior recognition from drone videos in which troops are observed on the move in their natural environment as they travel to and from their sleeping sites. Videos were captured by drones at Mpala Research Centre, a research station in Laikipia County, central Kenya. The baboon detection dataset was created by manually annotating all baboons in the drone videos with bounding boxes. A tiling method was then applied to create a pyramid of images at various scales from the original 5.3K-resolution images, yielding approximately 30K images for baboon detection. The baboon tracking dataset is derived from the detection dataset, with bounding boxes consistently assigned the same ID throughout each video; this yielded half an hour of dense tracking data. The baboon behavior recognition dataset was generated by converting tracks into mini-scenes, video subregions centered on each animal, which were annotated with 12 distinct behavior types plus one additional category for occlusion, resulting in over 20 hours of data. Benchmark results show a mean average precision (mAP) of 92.62% for the YOLOv8-X detection model, a multiple object tracking precision (MOTP) of 87.22% for the DeepSORT tracking algorithm, and a micro top-1 accuracy of 64.89% for the X3D behavior recognition model. Using deep learning to rapidly and accurately classify wildlife behavior from drone footage enables non-invasive behavioral data collection, allowing the behavior of a whole group to be recorded systematically and accurately. The dataset can be accessed at https://baboonland.xyz. Free, publicly accessible full text available June 16, 2026.
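  The mini-scene construction described above reduces to cropping a fixed window around each tracked animal across frames. Below is a minimal sketch of that step, assuming tracks are given as per-frame box centers; the crop size and function name are illustrative assumptions, not the dataset's actual pipeline.

  ```python
  # Minimal sketch: extract a "mini-scene" (a fixed-size video subregion
  # centered on one tracked animal) from full drone frames.
  # CROP and the track format are assumptions for illustration only.
  import numpy as np

  CROP = 400  # assumed mini-scene side length in pixels

  def mini_scene(frames, centers):
      """frames: list of HxWx3 uint8 arrays; centers: per-frame (cx, cy)
      taken from the animal's tracked bounding boxes."""
      clips = []
      for frame, (cx, cy) in zip(frames, centers):
          h, w = frame.shape[:2]
          # Clamp the window so the crop stays fully inside the image.
          x0 = int(min(max(cx - CROP // 2, 0), w - CROP))
          y0 = int(min(max(cy - CROP // 2, 0), h - CROP))
          clips.append(frame[y0:y0 + CROP, x0:x0 + CROP])
      return np.stack(clips)  # (T, CROP, CROP, 3) clip for the behavior model
  ```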
- Drones have become invaluable tools for studying animal behaviour in the wild, enabling researchers to collect aerial video data of group‐living animals. However, manually piloting drones to track animal groups consistently is challenging due to complex factors such as terrain, vegetation, group spread and movement patterns. The variability in manual piloting can result in unusable data for downstream behavioural analysis, making it difficult to collect standardized datasets for studying collective animal behaviour. To address these challenges, we present WildWing, a complete open‐source hardware and software unmanned aerial system (UAS) for autonomously collecting behavioural video data of group‐living animals. The system's main goal is to automate and standardize the collection of high‐quality aerial footage suitable for computer vision‐based behaviour analysis. We provide a novel navigation policy that autonomously tracks animal groups while maintaining optimal camera angles and distances for behavioural analysis, reducing the inconsistencies inherent in manual piloting. The complete WildWing system costs only $650 and combines drone hardware with custom software that integrates ecological knowledge into autonomous navigation decisions. The system produces 4K-resolution video at 30 fps while automatically maintaining appropriate distances and angles for behaviour analysis. We validate the system through field deployments tracking groups of Grevy's zebras, giraffes and Przewalski's horses at The Wilds conservation centre, demonstrating its ability to collect usable behavioural data consistently. By automating the data collection process, WildWing helps ensure consistent, high‐quality video data suitable for computer vision analysis of animal behaviour. This standardization is crucial for developing robust automated behaviour recognition systems to help researchers study and monitor wildlife populations at scale. The open‐source nature of WildWing makes autonomous behavioural data collection more accessible to researchers, enabling wider application of drone‐based behavioural monitoring in conservation and ecological research. Free, publicly accessible full text available March 10, 2026.
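  As a rough illustration of the kind of navigation policy WildWing automates, the sketch below keeps the detected group centered in frame at a fixed standoff distance using a simple proportional controller. The gains, target range, and velocity interface are assumptions for illustration, not WildWing's published policy.

  ```python
  # Minimal sketch of a centroid-tracking navigation policy: center the
  # detected group in the image and hold a target standoff distance.
  # K_XY, K_Z, and the (vx, vy, vz) interface are illustrative assumptions.
  K_XY = 0.4  # assumed gain: image-space error -> lateral velocity
  K_Z = 0.3   # assumed gain: range error -> closing velocity

  def navigation_command(detections, frame_w, frame_h, range_m, target_range_m=30.0):
      """detections: list of (x, y) animal centers in pixels.
      Returns (vx, vy, vz) velocity setpoints for the flight controller."""
      if not detections:
          return (0.0, 0.0, 0.0)  # hold position when the group is lost
      cx = sum(x for x, _ in detections) / len(detections)
      cy = sum(y for _, y in detections) / len(detections)
      # Normalized image-space error of the group centroid from frame center.
      ex = (cx - frame_w / 2) / frame_w
      ey = (cy - frame_h / 2) / frame_h
      # Normalized range error; the sign convention here is illustrative.
      ez = (range_m - target_range_m) / target_range_m
      return (K_XY * ex, K_XY * ey, -K_Z * ez)
  ```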
- In this paper, we extend the dataset statistics, model benchmarks, and performance analysis for the recently published KABR dataset, an in situ dataset for ungulate behavior recognition using aerial footage from the Mpala Research Centre in Kenya. The dataset comprises video footage of reticulated giraffes (Giraffa reticulata), Plains zebras (Equus quagga), and Grévy's zebras (Equus grevyi) captured using a DJI Mavic 2S drone. It includes both spatiotemporal annotations (i.e., mini-scenes) and behavior annotations provided by an expert behavioral ecologist. In total, KABR comprises more than 10 hours of annotated video. We extend the previous work in four key areas: (i) providing comprehensive dataset statistics that reveal new insights into the data distribution across behavior classes and species; (ii) extending the set of benchmark models to include a new state-of-the-art transformer; (iii) investigating weight-initialization strategies and exploring whether pretraining on human action recognition datasets transfers to in situ animal behavior recognition directly (i.e., zero-shot) or as initialization for end-to-end model training; and (iv) performing a detailed statistical analysis of model performance across species, behaviors, and formally defined segments of the long-tailed distribution. The KABR dataset addresses the limitations of previous datasets sourced from controlled environments, offering a more authentic representation of natural animal behaviors. This work marks a significant advancement in the automatic analysis of wildlife behavior, leveraging drone technology to overcome traditional observational challenges and enabling a more nuanced understanding of animal interactions in their natural habitats. The dataset is available at https://kabrdata.xyz. Free, publicly accessible full text available December 21, 2025.
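  The weight-initialization comparison in (iii) amounts to starting a video model from human-action pretraining and retargeting its classifier to animal behaviors. The sketch below shows one way to do this with an X3D model from the pytorchvideo hub; the head attribute and the class count are assumptions that may differ across library versions, and this is not necessarily the paper's exact setup.

  ```python
  # Sketch: initialize an X3D video model from Kinetics (human action)
  # pretraining, then replace its classification head for behavior classes.
  # NUM_BEHAVIORS and the head attribute are illustrative assumptions.
  import torch
  import torch.nn as nn

  NUM_BEHAVIORS = 8  # placeholder; use the dataset's actual class count

  model = torch.hub.load("facebookresearch/pytorchvideo", "x3d_m", pretrained=True)
  # Reuse the pretrained backbone; swap only the final projection so the
  # classifier can be trained for animal behavior recognition.
  head = model.blocks[-1]
  head.proj = nn.Linear(head.proj.in_features, NUM_BEHAVIORS)
  # From here, either fine-tune end-to-end or freeze the backbone and train
  # only the new head, mirroring the initialization strategies compared above.
  ```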
- In situ imageomics is a new approach to studying ecological, biological, and evolutionary systems wherein large image and video datasets are captured in the wild and machine learning methods are used to infer biological traits of individual organisms, animal social groups, species, and even whole ecosystems. Monitoring biological traits over large spaces and long periods of time could enable new, data-driven approaches to wildlife conservation, biodiversity, and sustainable ecosystem management. However, to accurately infer biological traits, machine learning methods for images require voluminous, high-quality data. Adaptive, data-driven approaches are hamstrung by the speed at which data can be captured and processed. Camera traps and unmanned aerial vehicles (UAVs) produce voluminous data, but they lose track of individuals over large areas, fail to capture social dynamics, and waste time and storage on images with poor lighting and viewing angles. In this vision paper, we make the case for a research agenda for in situ imageomics that depends on significant advances in autonomic and self-aware computing. Specifically, we seek autonomous data collection that manages camera angles, aircraft positioning, conflicting actions for multiple traits of interest, energy availability, and cost factors. Given tools to detect objects and identify individuals, we propose a research challenge: which optimization model should the data collection system employ to accurately identify, characterize, and draw inferences from biological traits while respecting a budget? Using zebra and giraffe behavioral data collected over three weeks at the Mpala Research Centre in Laikipia County, Kenya, we quantify the volume and quality of data collected using existing approaches. Our proposed autonomic navigation policy for in situ imageomics collection achieves an F1 score of 82% compared to an expert pilot while providing greater safety and consistency, suggesting great potential for state-of-the-art autonomic approaches if they can be scaled up to fully address the problem.
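  For readers unfamiliar with scoring a policy against a human pilot, the sketch below computes an F1 score by treating each clip the expert judged usable as a positive label. This framing and the function are illustrative, not the paper's exact evaluation protocol.

  ```python
  # Minimal sketch: F1 score of an autonomous collection policy against an
  # expert pilot, treating expert-kept clips as ground-truth positives.
  def f1_score(policy_kept, expert_kept):
      """Both arguments: sets of clip (or frame) identifiers judged usable."""
      tp = len(policy_kept & expert_kept)
      precision = tp / len(policy_kept) if policy_kept else 0.0
      recall = tp / len(expert_kept) if expert_kept else 0.0
      if precision + recall == 0:
          return 0.0
      return 2 * precision * recall / (precision + recall)

  # Example: f1_score({1, 2, 3, 5}, {2, 3, 4, 5}) == 0.75
  ```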
- Although the COVID-19 vaccine has dramatically changed the fight against the pandemic, many people exhibit vaccination hesitancy. At the same time, continued human-induced emissions of greenhouse gases pose an alarming threat to humanity. Based on the theory of Subjective Expected Relative Similarity (SERS) and a recent international study that markedly shifted COVID-19 health-related attitudes, we explain why a similar approach and a corresponding public policy can be expected to help resolve both behavioural issues: reducing vaccination hesitancy and motivating climate action.
- Human health and livelihoods are threatened by declining marine fisheries catches, prompting substantial interest in the sources and dynamics of fishing. Catch analyses within individual exclusive economic zones (EEZs) and the high seas are abundant, and research across multiple EEZs is growing. However, no previous studies have systematically compared catches, intranational versus international fish flows, and fishing nations within all of the world's EEZs and across adjacent and distant EEZs and the high seas to inform "metacoupled" fisheries management. We use the metacoupling framework, a new approach for evaluating human–nature interactions within and across adjacent and distant systems (metacouplings), to illustrate how fisheries catches were locally, regionally, and globally interconnected in 1950–2014, totaling 5.8 billion metric tons and increasing by 298% in tonnage and 431% in monetary value over this period. Catches by nations in their own EEZs (largest in Peru) and in adjacent EEZs (largest in Indonesia) constituted 86% of worldwide catches, growing in 1950–1996 but declining in 1997–2014. In contrast, catches in distant EEZs and the high seas, largest for Morocco, Mauritania, and Canada, peaked in 1973 and have since represented 9–21% of annual catches. Our 65-year, local-regional-global analysis illustrates how metacoupled fisheries governance, that is, holistic management of multiscalar catches, flows, and tradeoffs within and among fisheries, can improve food and nutrition security, livelihood resilience, and biodiversity conservation across the world.
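  The catch categories above (own EEZ, adjacent EEZ, distant EEZ or high seas) map onto the metacoupling framework's intracoupling, pericoupling, and telecoupling distinctions. The sketch below shows one way such a classification might be coded; the record fields and the adjacency table are illustrative assumptions, not the study's actual data model.

  ```python
  # Minimal sketch: classify a catch record by metacoupling category based
  # on where the fishing nation caught it. ADJACENT is a placeholder table.
  ADJACENT = {("Peru", "Ecuador"), ("Indonesia", "Malaysia")}  # placeholder pairs

  def coupling_type(fishing_nation, catch_zone):
      """catch_zone: the EEZ owner's name, or 'high seas'."""
      if catch_zone == "high seas":
          return "telecoupling"   # distant system, no EEZ owner
      if catch_zone == fishing_nation:
          return "intracoupling"  # catch within the nation's own EEZ
      if (fishing_nation, catch_zone) in ADJACENT or (catch_zone, fishing_nation) in ADJACENT:
          return "pericoupling"   # catch in an adjacent nation's EEZ
      return "telecoupling"       # catch in a distant nation's EEZ
  ```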